In numerical analysis, one of the most important problems is designing efficient and stable algorithms for finding the eigenvalues of a matrix. These eigenvalue algorithms may also find eigenvectors.

==Eigenvalues and eigenvectors==
Given an <math>n \times n</math> square matrix <math>A</math> of real or complex numbers, an ''eigenvalue'' <math>\lambda</math> and its associated ''generalized eigenvector'' <math>\mathbf{v}</math> are a pair obeying the relation
:<math>\left(A - \lambda I\right)^k \mathbf{v} = 0,</math>
where <math>\mathbf{v}</math> is a nonzero <math>n \times 1</math> column vector, <math>I</math> is the <math>n \times n</math> identity matrix, <math>k</math> is a positive integer, and both <math>\lambda</math> and <math>\mathbf{v}</math> are allowed to be complex even when <math>A</math> is real. When <math>k = 1</math>, the vector is called simply an ''eigenvector'', and the pair is called an ''eigenpair''. In this case, <math>A\mathbf{v} = \lambda\mathbf{v}</math>. Any eigenvalue <math>\lambda</math> of <math>A</math> has ordinary〔The term "ordinary" is used here only to emphasize the distinction between "eigenvector" and "generalized eigenvector".〕 eigenvectors associated to it, for if <math>k</math> is the smallest integer such that <math>\left(A - \lambda I\right)^k \mathbf{v} = 0</math> for a generalized eigenvector <math>\mathbf{v}</math>, then <math>\left(A - \lambda I\right)^{k-1} \mathbf{v}</math> is an ordinary eigenvector. The value <math>k</math> can always be taken as less than or equal to <math>n</math>. In particular, <math>\left(A - \lambda I\right)^n \mathbf{v} = 0</math> for all generalized eigenvectors <math>\mathbf{v}</math> associated with <math>\lambda</math>.

For each eigenvalue <math>\lambda</math> of <math>A</math>, the kernel <math>\ker\left(A - \lambda I\right)</math> consists of all eigenvectors associated with <math>\lambda</math> (along with 0), called the ''eigenspace'' of <math>\lambda</math>, while the vector space <math>\ker\left(\left(A - \lambda I\right)^n\right)</math> consists of all generalized eigenvectors, and is called the ''generalized eigenspace''. The ''geometric multiplicity'' of <math>\lambda</math> is the dimension of its eigenspace. The ''algebraic multiplicity'' of <math>\lambda</math> is the dimension of its generalized eigenspace. The latter terminology is justified by the equation
:<math>p_A\left(z\right) = \det\left(zI - A\right) = \prod_{i} \left(z - \lambda_i\right)^{\alpha_i},</math>
where <math>\det</math> is the determinant function, the <math>\lambda_i</math> are all the distinct eigenvalues of <math>A</math> and the <math>\alpha_i</math> are the corresponding algebraic multiplicities. The function <math>p_A\left(z\right)</math> is the ''characteristic polynomial'' of <math>A</math>. So the algebraic multiplicity is the multiplicity of the eigenvalue as a zero of the characteristic polynomial. Since any eigenvector is also a generalized eigenvector, the geometric multiplicity is less than or equal to the algebraic multiplicity. The algebraic multiplicities sum up to <math>n</math>, the degree of the characteristic polynomial. The equation <math>p_A\left(z\right) = 0</math> is called the ''characteristic equation'', as its roots are exactly the eigenvalues of <math>A</math>. By the Cayley–Hamilton theorem, <math>A</math> itself obeys the same equation: <math>p_A\left(A\right) = 0</math>.〔where the constant term is multiplied by the identity matrix <math>I</math>.〕 As a consequence, the columns of the matrix <math>\prod_{i \ne j} \left(A - \lambda_i I\right)^{\alpha_i}</math> must be either 0 or generalized eigenvectors of the eigenvalue <math>\lambda_j</math>, since they are annihilated by <math>\left(A - \lambda_j I\right)^{\alpha_j}</math>. In fact, the column space is the generalized eigenspace of <math>\lambda_j</math>.

Any collection of generalized eigenvectors of distinct eigenvalues is linearly independent, so a basis for all of <math>\mathbb{C}^n</math> can be chosen consisting of generalized eigenvectors. More particularly, this basis <math>\left\{\mathbf{v}_i\right\}_{i=1}^n</math> can be chosen and organized so that
* if <math>\mathbf{v}_i</math> and <math>\mathbf{v}_j</math> have the same eigenvalue, then so does <math>\mathbf{v}_k</math> for each <math>k</math> between <math>i</math> and <math>j</math>, and
* if <math>\mathbf{v}_i</math> is not an ordinary eigenvector, and if <math>\lambda_i</math> is its eigenvalue, then <math>\left(A - \lambda_i I\right)\mathbf{v}_i = \mathbf{v}_{i-1}</math> (in particular, <math>\mathbf{v}_1</math> must be an ordinary eigenvector).

If these basis vectors are placed as the column vectors of a matrix <math>V = \left[\,\mathbf{v}_1\;\mathbf{v}_2\;\cdots\;\mathbf{v}_n\,\right]</math>, then <math>V</math> can be used to convert <math>A</math> to its Jordan normal form:
:<math>V^{-1}AV = \begin{bmatrix} \lambda_1 & \beta_1 & & & \\ & \lambda_2 & \beta_2 & & \\ & & \ddots & \ddots & \\ & & & \lambda_{n-1} & \beta_{n-1} \\ & & & & \lambda_n \end{bmatrix},</math>
where the <math>\lambda_i</math> are the eigenvalues, <math>\beta_i = 1</math> if <math>\left(A - \lambda_{i+1} I\right)\mathbf{v}_{i+1} = \mathbf{v}_i</math> and <math>\beta_i = 0</math> otherwise.

More generally, if <math>W</math> is any invertible matrix, and <math>\lambda</math> is an eigenvalue of <math>A</math> with generalized eigenvector <math>\mathbf{v}</math>, then <math>\left(W^{-1}AW - \lambda I\right)^k W^{-1}\mathbf{v} = 0</math>. Thus <math>\lambda</math> is an eigenvalue of <math>W^{-1}AW</math> with generalized eigenvector <math>W^{-1}\mathbf{v}</math>. That is, similar matrices have the same eigenvalues.
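As a concrete check of the eigenpair relation, the following minimal sketch (assuming NumPy is available; the matrix <code>A</code> is an arbitrary illustrative example, not taken from this article) computes eigenpairs with <code>numpy.linalg.eig</code> and verifies that each pair satisfies <math>A\mathbf{v} = \lambda\mathbf{v}</math> up to floating-point round-off.

<syntaxhighlight lang="python">
import numpy as np

# Arbitrary symmetric 3x3 example matrix (illustrative only).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# numpy.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (ordinary) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Residual of the k = 1 relation (A - lambda*I) v = 0, i.e. A v = lambda v.
    residual = np.linalg.norm(A @ v - lam * v)
    print("lambda =", lam, "  ||A v - lambda v|| =", residual)
</syntaxhighlight>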
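The characteristic polynomial, the two notions of multiplicity, and the Cayley–Hamilton theorem can likewise be checked symbolically. A minimal sketch, assuming SymPy is available and using an arbitrary illustrative matrix whose eigenvalue 2 has algebraic multiplicity 2 but geometric multiplicity 1:

<syntaxhighlight lang="python">
import sympy as sp

z = sp.symbols('z')

# Illustrative example with a repeated eigenvalue.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])

# Characteristic polynomial p_A(z) = det(z*I - A).
p = A.charpoly(z)
print(p.as_expr().factor())        # factors as (z - 2)**2 * (z - 3)

# Algebraic multiplicities: multiplicity of each root of p_A.
print(A.eigenvals())               # {2: 2, 3: 1}

# Geometric multiplicities: dimension of each eigenspace ker(A - lambda*I).
for lam, alg_mult, basis in A.eigenvects():
    print(lam, "geometric multiplicity:", len(basis))

# Cayley-Hamilton: evaluating p_A at A (with the constant term times I)
# gives the zero matrix. Horner evaluation of the polynomial at A:
coeffs = p.all_coeffs()            # highest-degree coefficient first
n = A.rows
pA = sp.zeros(n, n)
for c in coeffs:
    pA = pA * A + c * sp.eye(n)
print(pA)                          # prints the zero matrix
</syntaxhighlight>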
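The Jordan normal form and the invariance of eigenvalues under similarity can also be illustrated symbolically. A minimal sketch, again assuming SymPy; the matrices <code>A</code> and <code>W</code> are arbitrary examples chosen so that <code>A</code> has a nontrivial Jordan block and <code>W</code> is invertible:

<syntaxhighlight lang="python">
import sympy as sp

# Example with a 2x2 Jordan block for eigenvalue 2 and a 1x1 block for 3.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, 3]])

# Columns of V form a chain basis of generalized eigenvectors;
# J = V^-1 * A * V is the Jordan normal form of A.
V, J = A.jordan_form()
print(J)
assert V.inv() * A * V == J

# Similar matrices have the same eigenvalues: conjugating by any
# invertible W leaves the spectrum (with multiplicities) unchanged.
W = sp.Matrix([[1, 2, 0],
               [0, 1, 1],
               [1, 0, 1]])        # arbitrary invertible matrix
B = W.inv() * A * W
assert A.eigenvals() == B.eigenvals()
</syntaxhighlight>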